robotic camera


Understanding Generative AI in Robot Logic Parametrization

Hwang, Yuna, Sato, Arissa J., Praveena, Pragathi, White, Nathan Thomas, Mutlu, Bilge

arXiv.org Artificial Intelligence

Leveraging generative AI (for example, Large Language Models) for language understanding within robotics opens up possibilities for LLM-driven robot end-user development (EUD). Despite the numerous design opportunities it provides, little is understood about how this technology can be utilized when constructing robot program logic. In this paper, we outline the background in capturing natural language end-user intent and summarize previous use cases of LLMs within EUD. Taking the context of filmmaking as an example, we explore how a cinematography practitioner's intent to film a certain scene can be articulated using natural language, captured by an LLM, and further parametrized as low-level robot arm movement. We explore the capabilities of an LLM in interpreting end-user intent and in mapping natural language to predefined, cross-modal data during iterative program development. We conclude by suggesting future opportunities for domain exploration beyond cinematography to support language-driven robotic camera navigation.
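The pipeline the abstract describes (natural-language intent, captured by an LLM, then parametrized as low-level arm movement) could be sketched roughly as below. This is a minimal illustration, not the paper's method: the `CameraMove` structure, the `parametrize` function, the motion vocabulary, and the canned JSON standing in for an LLM reply are all hypothetical names invented for this example.

```python
import json
from dataclasses import dataclass

# Hypothetical low-level motion parameters for a robot camera arm.
@dataclass
class CameraMove:
    motion: str        # e.g. "dolly", "pan", "orbit" (assumed vocabulary)
    speed_mps: float   # linear speed in metres per second
    duration_s: float  # how long the move runs

# In the paper's pipeline an LLM would turn the practitioner's request
# (e.g. "slowly dolly in on the actor for five seconds") into structured
# output; here a fixed JSON string stands in for that model reply.
llm_output = '{"motion": "dolly", "speed_mps": 0.2, "duration_s": 5.0}'

def parametrize(llm_json: str) -> CameraMove:
    """Map the LLM's structured reply onto validated motion parameters."""
    fields = json.loads(llm_json)
    allowed = {"dolly", "pan", "orbit"}
    if fields["motion"] not in allowed:
        raise ValueError(f"unsupported motion: {fields['motion']}")
    return CameraMove(
        motion=fields["motion"],
        speed_mps=float(fields["speed_mps"]),
        duration_s=float(fields["duration_s"]),
    )

move = parametrize(llm_output)
print(move.motion, move.speed_mps, move.duration_s)
```

Validating the model's output against a predefined vocabulary before it reaches the robot is one plausible way to keep free-form language grounded in executable motion primitives.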


Alligator spotted roaming Florida city's underground stormwater pipes with robotic camera

FOX News

Crews in Oviedo, Florida, were investigating underground pipes for anomalies when their robotic camera ran into a 5-foot alligator. A city crew in Florida spotted a 5-foot alligator lurking in a stormwater pipe while investigating the pipes with a robotic camera last week, officials said Tuesday. The stormwater crew in the city of Oviedo, located about 20 miles northeast of Orlando, was on Lockwood Boulevard to check on a series of potholes that appeared in the roadway on Friday, the city said in a Facebook post. The crew used a four-wheel robotic camera to go into the pipes below the road and investigate any anomalies such as leaking pipes, cracks or other defects underground, officials said. However, crews instead found a different kind of anomaly while searching the underground pipes.


Exploring the Use of Collaborative Robots in Cinematography

Praveena, Pragathi, Cagiltay, Bengisu, Gleicher, Michael, Mutlu, Bilge

arXiv.org Artificial Intelligence

Robotic technology can support the creation of new tools that improve the creative process of cinematography. It is crucial to consider the specific requirements and perspectives of industry professionals when designing and developing these tools. In this paper, we present the results from exploratory interviews with three cinematography practitioners, which included a demonstration of a prototype robotic system. We identified many factors that can impact the design, adoption, and use of robotic support for cinematography, including: (1) the ability to meet requirements for cost, quality, mobility, creativity, and reliability; (2) the compatibility and integration of tools with existing workflows, equipment, and software; and (3) the potential for new creative opportunities that robotic technology can open up. Our findings provide a starting point for future co-design projects that aim to support the work of cinematographers with collaborative robots.


Lutz company uses a robotic camera for a coronavirus-safer production studio – IAM Network

#artificialintelligence

Diamond View Studios has been taking precautions to protect employees and clients during the pandemic. Everyone entering the 10,000-square-foot television, film and video production facility at 1616 E Bearss Ave. in Lutz first has their temperature taken and then must fill out a questionnaire asking if they have coronavirus symptoms or have been exposed recently to anyone who has. Half the 28-person staff is working from home and the other half, plus clients, must always wear masks inside. Diamond View -- clients include the Atlanta Braves and the University of Florida -- took precautions a step further early in July. They created a more coronavirus-safe 4,000-square-foot studio that uses a robotic camera operated via remote control from an adjoining room. There's no crew on set when the cast is saying lines. This allows the cast to remove masks with less concern that they will contract the coronavirus. They leave the set when crew must tinker with lights or the camera. And a green screen allows Diamond View to turn the studio into anyplace in the world.


Video Friday: Waffle Robots, Laser vs. Drone, and TurtleBot Tutorials

IEEE Spectrum Robotics

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. That was pretty good but doesn't quite stack up against this old ABB pancake robot video, which will always be my favorite: The title of this video is "Laser Dune Buggy vs. Drone." Slightly underwhelming, maybe, but it does have an effective range of a mile, which is impressive.


BBC series uses robot creatures to document secret lives of animals

#artificialintelligence

What does a newly hatched crocodile see while it is being transported to water between its mother's jaws? How should a wild dog pup behave if it wants to be accepted by an approaching pack of adults? These and other questions will be answered in a new BBC wildlife series screening this week, in which the stars of the show are not only the animals being filmed, but the animatronic "spy creatures" used to film them. Spy in the Wild is the BBC's first major natural history series since Planet Earth II, but the footage that makes up the five-part series was captured in a very different way to Sir David Attenborough's wildlife spectacular. Using 30 remote-controlled robotic animals, each concealing miniature cameras, programme-makers captured footage they say is among some of the most intimate and revealing to date, showing a range of animal behaviours that appear to demonstrate grief, friendship and even empathy with other species.


Military set to develop smart, robotic cameras

AITopics Original Links

In a move seemingly straight out of the Terminator movies, the Defense Advanced Research Projects Agency this week said it has contracted with 15 companies or universities to begin building software and hardware that will give machines or robots visual intelligence similar to humans. DARPA said the program, known as Mind's Eye, should give machines the "perceptual and cognitive abilities for recognizing and reasoning about the actions it sees and report or act upon it." "Humans perform a wide range of visual tasks with ease, something no current artificial intelligence can do in a robust way. They have inherently strong spatial judgment and are able to learn new spatiotemporal concepts directly from the visual experience. Humans visualize scenes and objects, as well as the actions involving those objects and possess a powerful ability to manipulate those imagined scenes mentally to solve problems. A machine-based implementation of such abilities is broadly applicable to a wide range of applications, including ground surveillance," DARPA stated.